
    Contrastive learning and neural oscillations

    The concept of Contrastive Learning (CL) is developed as a family of possible learning algorithms for neural networks. CL is an extension of Deterministic Boltzmann Machines to more general dynamical systems. During learning, the network oscillates between two phases: one with a teacher signal and one without. The weights are updated using a learning rule that corresponds to gradient descent on a contrast function measuring the discrepancy between the free network and the network with a teacher signal. The CL approach provides a general unified framework for developing new learning algorithms, and shows that many different types of clamping and teacher signals are possible. Several examples are given, and an analysis of the landscape of the contrast function is proposed, with some relevant predictions for the CL curves. An approach that may be suitable for collective analog implementations is described. Simulation results and possible extensions are briefly discussed, together with a new conjecture regarding the function of certain oscillations in the brain. In the appendix, we also examine two extensions of contrastive learning to time-dependent trajectories.
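The two-phase update described above can be sketched numerically. The relaxation scheme, network sizes, and clamping choices below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy contrastive-learning step on a small symmetric network.
# Units 0-1 are inputs, unit 2 is hidden, units 3-4 are outputs.
n = 5
W = rng.normal(scale=0.1, size=(n, n))
W = (W + W.T) / 2               # symmetric weights
np.fill_diagonal(W, 0.0)        # no self-connections

def relax(W, s, clamped, steps=50):
    """Settle the network by repeated deterministic updates;
    units listed in `clamped` are held fixed."""
    s = s.copy()
    for _ in range(steps):
        for i in range(len(s)):
            if i not in clamped:
                s[i] = np.tanh(W[i] @ s)
    return s

x = np.array([1.0, -1.0, 0.0, 0.0, 0.0])   # input pattern
teacher = {3: 1.0, 4: -1.0}                # desired outputs

# Free phase: only the inputs are clamped.
s_free = relax(W, x, clamped={0, 1})

# Teacher phase: inputs and outputs are clamped.
s_teach = x.copy()
for i, v in teacher.items():
    s_teach[i] = v
s_teach = relax(W, s_teach, clamped={0, 1, 3, 4})

# Contrastive update: a Hebbian term from the teacher phase minus an
# anti-Hebbian term from the free phase, i.e. a step that reduces the
# contrast between the two settled states.
eta = 0.1
dW = eta * (np.outer(s_teach, s_teach) - np.outer(s_free, s_free))
np.fill_diagonal(dW, 0.0)
W = W + dW                      # dW is symmetric, so W stays symmetric
```

Repeating the two phases over a set of input/teacher pairs would drive the free-phase fixed points toward the clamped ones.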

    Temporal evolution of generalization during learning in linear networks

    We study generalization in a simple framework of feedforward linear networks with n inputs and n outputs, trained from examples by gradient descent on the usual quadratic error function. We derive analytical results on the behavior of the validation function corresponding to the LMS error function calculated on a set of validation patterns. We show that the behavior of the validation function depends critically on the initial conditions and on the characteristics of the noise. Under certain simple assumptions, if the initial weights are sufficiently small, the validation function has a unique minimum corresponding to an optimal stopping time for training, for which simple bounds can be calculated. There also exist situations where the validation function can exhibit more complicated and somewhat unexpected behavior, such as multiple local minima (at most n) of variable depth and long but finite plateau effects. Additional results and possible extensions are briefly discussed.
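A minimal numerical illustration of the early-stopping setting described above, assuming a plain linear network y = Wx trained by gradient descent on noisy targets; the dimensions, noise level, and learning rate are arbitrary choices, not the paper's analytical setup:

```python
import numpy as np

rng = np.random.default_rng(1)

# Linear network y = W x trained by gradient descent on the quadratic
# (LMS) error, with a separate validation set tracking generalization.
n, p_train, p_val = 8, 20, 200
W_true = rng.normal(size=(n, n))

X_tr = rng.normal(size=(n, p_train))
Y_tr = W_true @ X_tr + 0.5 * rng.normal(size=(n, p_train))  # noisy targets
X_va = rng.normal(size=(n, p_val))
Y_va = W_true @ X_va                                        # clean validation targets

W = 0.01 * rng.normal(size=(n, n))      # small initial weights
lr, steps = 0.005, 500
val_errors = []
for t in range(steps):
    grad = (W @ X_tr - Y_tr) @ X_tr.T / p_train   # LMS gradient
    W -= lr * grad
    val_errors.append(np.mean((W @ X_va - Y_va) ** 2))

# The minimum of the validation curve gives an early-stopping time.
t_star = int(np.argmin(val_errors))
```

With small initial weights and noisy training targets, the validation curve typically decreases and then turns back up, which is the behavior the analytical results characterize.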

    Neural Networks for Fingerprint Recognition

    After collecting a database of fingerprint images, we design a neural network algorithm for fingerprint recognition. When presented with a pair of fingerprint images, the algorithm outputs an estimate of the probability that the two images originate from the same finger. In one experiment, the neural network is trained using a few hundred pairs of images and its performance is subsequently tested using several thousand pairs of images originating from a subset of the database corresponding to 20 individuals. The error rate currently achieved is less than 0.5%. Additional results, extensions, and possible applications are also briefly discussed.
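The pairwise-comparison idea can be sketched with a shared feature map applied to both images and a logistic unit on the feature difference producing P(same finger). This architecture is an assumption for illustration only and does not reproduce the paper's actual network:

```python
import numpy as np

rng = np.random.default_rng(2)

# Shared (untrained) feature map applied to both images, with a
# logistic unit on the absolute feature difference.
d_img, d_feat = 64, 16
W_feat = rng.normal(scale=1 / np.sqrt(d_img), size=(d_feat, d_img))
w_out = rng.normal(scale=0.1, size=d_feat)
b_out = 0.0

def features(img):
    return np.tanh(W_feat @ img)            # shared feature extractor

def p_same(img_a, img_b):
    diff = np.abs(features(img_a) - features(img_b))
    z = w_out @ diff + b_out
    return 1.0 / (1.0 + np.exp(-z))         # logistic output in (0, 1)

a = rng.normal(size=d_img)
b = rng.normal(size=d_img)
# Identical images give a zero feature difference, so this untrained
# network outputs exactly sigmoid(b_out) = 0.5 for them.
p_match, p_nonmatch = p_same(a, a), p_same(a, b)
```

Using the absolute feature difference makes the output symmetric in its two arguments, a natural property for a same-finger probability.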

    Computing with arrays of coupled oscillators: An application to preattentive texture discrimination

    Recent experimental findings (Gray et al. 1989; Eckhorn et al. 1988) seem to indicate that rapid oscillations and phase-lockings of different populations of cortical neurons play an important role in neural computations. In particular, global stimulus properties could be reflected in the correlated firing of spatially distant cells. Here we describe how simple coupled oscillator networks can be used to model the data and to investigate whether useful tasks can be performed by oscillator architectures. A specific demonstration is given for the problem of preattentive texture discrimination. Texture images are convolved with different sets of Gabor filters feeding into several corresponding arrays of coupled oscillators. After a brief transient, the dynamic evolution in the arrays leads to a separation of the textures by a phase labeling mechanism. The importance of noise and of long-range connections is briefly discussed.
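The phase-labeling mechanism can be illustrated with Kuramoto-style phase oscillators, assuming two texture regions map to two oscillator groups with strong within-group coupling; all coupling constants and frequencies below are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

# Two "texture regions" map to two groups of phase oscillators:
# strong coupling within each group, weak coupling across. After a
# transient, each group phase-locks internally, so the regions end up
# separated by a phase label.
n = 10                                  # oscillators 0-4: region A, 5-9: region B
K = np.full((n, n), 0.02)               # weak cross-coupling
K[:5, :5] = 1.0                         # strong coupling within region A
K[5:, 5:] = 1.0                         # strong coupling within region B
omega = np.concatenate([1.0 + 0.05 * rng.normal(size=5),    # region A frequencies
                        1.5 + 0.05 * rng.normal(size=5)])   # region B frequencies
theta = 2 * np.pi * rng.random(n)       # random initial phases

dt = 0.01
for _ in range(5000):
    # Kuramoto dynamics: dtheta_i/dt = omega_i + sum_j K_ij sin(theta_j - theta_i)
    theta += dt * (omega + (K * np.sin(theta[None, :] - theta[:, None])).sum(axis=1))

# Order parameter per group: close to 1 when the group is phase-locked.
R_A = np.abs(np.mean(np.exp(1j * theta[:5])))
R_B = np.abs(np.mean(np.exp(1j * theta[5:])))
```

Because the cross-coupling is too weak to lock the two groups to each other, each group synchronizes internally while drifting relative to the other, which is the phase-separation effect the abstract describes.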

    Hybrid modeling, HMM/NN architectures, and protein applications

    We describe a hybrid modeling approach where the parameters of a model are calculated and modulated by another model, typically a neural network (NN), to avoid both overfitting and underfitting. We develop the approach for the case of Hidden Markov Models (HMMs) by deriving a class of hybrid HMM/NN architectures. These architectures can be trained with unified algorithms that blend HMM dynamic programming with NN backpropagation. In the case of complex data, mixtures of HMMs or modulated HMMs must be used. NNs can then be applied both to the parameters of each single HMM and to the switching or modulation of the models, as a function of input or context. Hybrid HMM/NN architectures provide a flexible NN parameterization for the control of model structure and complexity. At the same time, they can capture distributions that, in practice, are inaccessible to single HMMs. The HMM/NN hybrid approach is tested, in its simplest form, by constructing a model of the immunoglobulin protein family. A hybrid model is trained, and a multiple alignment derived, with less than a fourth of the number of parameters used with previous single HMMs.
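The core idea, HMM parameters computed by a network rather than stored in a table, can be sketched as follows; the sizes and the single linear-softmax layer are illustrative assumptions, not the paper's architecture:

```python
import numpy as np

rng = np.random.default_rng(4)

# Instead of storing an HMM emission table directly, a small network
# computes each state's emission distribution from a learned state code,
# so parameters are shared across states and the table is never stored.
n_states, n_symbols, d_code = 6, 20, 4

state_code = rng.normal(scale=0.1, size=(n_states, d_code))   # per-state code
W = rng.normal(scale=0.1, size=(d_code, n_symbols))           # shared output weights

def emission_probs(state):
    """Emission distribution for one HMM state, produced by the network."""
    logits = state_code[state] @ W
    e = np.exp(logits - logits.max())
    return e / e.sum()                     # softmax: a valid distribution

E = np.array([emission_probs(s) for s in range(n_states)])

# Parameter count: 6*4 (codes) + 4*20 (weights) = 104, versus
# 6*20 = 120 for a directly stored emission table; the gap widens
# quickly for realistic state and alphabet sizes.
```

In a full hybrid, the gradients from HMM dynamic programming would flow into `state_code` and `W` by backpropagation, which is what lets the parameter count shrink without giving up modeling power.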

    Gagliardo-Nirenberg Inequalities for Differential Forms in Heisenberg Groups

    The L^1-Sobolev inequality states that the L^{n/(n-1)}-norm of a compactly supported function on Euclidean n-space is controlled by the L^1-norm of its gradient. The generalization to differential forms (due to Lanzani & Stein and Bourgain & Brezis) is recent, and states that the L^{n/(n-1)}-norm of a compactly supported differential h-form u is controlled by the L^1-norm of its exterior differential du and of its exterior codifferential δu (in special cases the L^1-norm must be replaced by the H^1-Hardy norm). We shall extend this result to Heisenberg groups in the framework of an appropriate complex of differential forms.
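In LaTeX notation, the two inequalities described above read as follows (notation reconstructed from the abstract; C denotes a constant depending only on the dimension):

```latex
% Classical L^1-Sobolev inequality for compactly supported u on R^n:
\| u \|_{L^{n/(n-1)}(\mathbb{R}^n)} \le C \, \| \nabla u \|_{L^1(\mathbb{R}^n)}

% Lanzani--Stein / Bourgain--Brezis generalization to a compactly
% supported differential h-form u (with L^1 replaced by the Hardy
% norm H^1 in the special cases mentioned above):
\| u \|_{L^{n/(n-1)}} \le C \left( \| du \|_{L^1} + \| \delta u \|_{L^1} \right)
```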